EFFector Online 4.2 12/17/1992 editors@eff.org
A Publication of the Electronic Frontier Foundation ISSN 1062-9424
-==--==--==-<>-==--==--==-
MEGATRENDS OR MEGAMISTAKES?
What Ever Happened to the Information Society?
(Part 2 of 2 parts. Part 1 was published in EFFector Online 4.1)
by Tom Forester, Senior Lecturer,
School of Computing & Information Technology,
Griffith University, Queensland, Australia
[Continued from EFFector Online 4.1]
UNINTENDED CONSEQUENCES
NEW SOCIAL VULNERABILITIES
The IT revolution has created a whole new range of problems for
society - problems which were largely unexpected. Some arise from
the propensity of computers to malfunction, others arise from their
misuse by humans.
As complex industrial societies become more dependent on computers,
they become more vulnerable to technological failure because
computers have often proved to be unreliable, insecure and
unmanageable. Malfunctioning hardware and software is much more
common than many (especially those in the computer industry!) would
have us believe. There is little doubt that we put too much faith in
these supposedly infallible machines. Computers are permeating almost
every aspect of our lives, but unlike other pervasive technologies
such as electricity, television and the motor car, computers are on
the whole less reliable and less predictable in their behaviour. This
is because they are discrete state digital electronic devices which
are prone to total and catastrophic failure. Computer systems, when
they are "down," are completely down, unlike analog or mechanical
devices which may only be partially down and are thus still partially
usable.
Popular areas for computer malfunctions include telephone billing and
telephone switching software, bank statements and bank teller
machines, electronic funds transfer systems and motor vehicle licence
databases. Industrial robots have been known to go berserk, while
heart pacemakers and automatic garage door openers have been rendered
useless by electro-magnetic radiation or "electronic smog" emitted
from point-of-sale terminals, personal computers and video games.
Although computers have often taken the "blame" on these occasions,
the ultimate cause of failure in most cases is, in fact, human error.
The cost of all this downtime is huge: for example, it has been
reported that British businesses suffer around 30 major mishaps a
year, involving losses of millions of pounds. The cost of software
failures alone in the UK is conservatively estimated at $900 million
per year (Woolnough 1988). In 1989, a British Computer Society
committee reported that much software was now so complex that current
skills in safety assessment were inadequate and therefore the safety
of people could not be guaranteed (Mellor 1989).
Computers enable enormous quantities of information to be stored,
retrieved and transmitted at great speed on a scale not possible
before. This is all very well, but it has serious implications for
data security and personal privacy because computer networks are
inherently insecure. The recent activities of hackers and data
thieves in the US, Germany and Britain have shown how all too easy it
still is to break into even the most sophisticated financial and
military systems. Malicious virus creators have wreaked havoc on
important academic and government communication networks. The list of
scams perpetrated by the new breed of high-tech criminals, ranging
from airline ticket reservation fraud to the reprogramming of the
chips inside mobile phones, is growing daily. Some people have had
their careers and lives ruined by unauthorized users gaining access
to supposedly confidential databases containing medical, financial
and criminal records.
Computer systems are often incredibly complex - so complex, in fact,
that they are not always understood even by their creators (although
few are willing to admit it!). This often makes them completely
unmanageable. Unmanageable complexity can result in massive foul-ups
or spectacular budget "runaways." For example, Bank of America in
1988 had to abandon a $20 million computer system after spending five
years and a further $60 million trying to make it work! Allstate
Insurance saw the cost of its new system rise from $8 million to a
staggering $100 million and estimated completion delayed from 1987 to
1993! Moreover, the problem seems to be getting worse: in 1988 the
American Arbitration Association took on 190 computer disputes, most
of which involved defective systems. The claims totalled $200 million
- up from only $31 million in 1984.
Complexity can also result in disaster: no computer is 100 per cent
guaranteed because it is virtually impossible to anticipate all
sources of failure. Yet computers are regularly being used for all
sorts of critical applications such as saving lives, flying aircraft,
running nuclear power stations, transferring vast sums of money and
controlling missile systems - and this can sometimes have tragic
consequences. For example, between 1982 and 1987, some 22 US
servicemen died in five separate crashes of the USAF's sophisticated
Blackhawk helicopter before the problem was traced to its computer-
based 'fly-by-wire' system (Forester and Morrison 1990). At least two
people were killed after receiving overdoses of radiation
administered by the computerized Therac 25 X-ray machines, and there
are many other examples of computer foul-ups causing death and injury
(Forester and Morrison 1990).
Just to rub it in, I should also point out that computer systems are
equally vulnerable to fires, floods, earthquakes and even quite
short power outages or voltage drops caused by "dirty power", as well
as attacks by outside hackers and sabotage from inside employees. For
example, in Chicago in 1986, a disgruntled employee at Encyclopedia
Britannica, angry at having been laid off, merely tapped into the
encyclopedia's database and made a few alterations to the text being
prepared for a new edition of the renowned work - like changing
references to Jesus Christ to Allah and inserting the names of
company executives in odd positions. As one executive commented, "In
the computer age, this is exactly what we have nightmares about".
A year later, another saboteur shut down the entire National
Association of Securities Dealers' automatic quotation service
(NASDAQ) for 82 minutes, keeping 20 million shares from being traded.
The saboteur in question was an adventurous squirrel, who had caused
a short circuit in Trumbull, Connecticut, where NASDAQ's main
computer is situated. In Australia, foxes have taken to digging up
new optical fibre cables to eat the plastic cover, while sharks have
been doing the same to submarine fibre optic telephone cables on the
floor of the Pacific Ocean. In Denmark, a strike by 600 computer
personnel paralysed the government for four months in 1987, causing
the ruling party to call an early general election (UPI 1987), while
in the same year an Australian saboteur carefully severed 24 cables
in a Sydney tunnel and knocked out 35,000 telephone, fax and point-
of-sale lines, putting hundreds of businesses in 40 suburbs out of
action for up to 48 hours (The Australian, 23 November 1987, page 1).
As society becomes more dependent on computers, we also become more
vulnerable to the misuse of computers by human beings. The theft of
cop